Conversation

@sofiagiappichini (Contributor)

A new producer is introduced to read PU weights from ROOT files, instead of using correctionlib with JSONs. This is adapted from the TauFW framework and validated for 2024 PU.

Copilot AI left a comment

Pull request overview

This PR introduces a new producer function PUWeightROOT to calculate pileup reweighting by directly reading from ROOT histogram files, providing an alternative to the existing correctionlib JSON-based approach. The implementation is adapted from the TauFW framework and has been validated for 2024 pileup conditions.
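
For orientation, a minimal sketch of the histogram-based weight is given below. This is illustrative only and not the PR's exact code; it assumes both true-pileup histograms have already been normalized to unit area, and the branch name Pileup_nTrueInt is the usual NanoAOD input for the true pileup mean.

#include "TH1D.h"

// Illustrative sketch: the per-event weight is the data/MC ratio of the
// unit-normalized true-pileup distributions, evaluated in the bin matching
// the event's true pileup mean (typically Pileup_nTrueInt in NanoAOD).
float pileupWeight(TH1D* datahist, TH1D* mchist, float truePUMean) {
    const double data = datahist->GetBinContent(datahist->FindBin(truePUMean));
    const double mc = mchist->GetBinContent(mchist->FindBin(truePUMean));
    return (mc > 0.0) ? static_cast<float>(data / mc) : 1.0f;
}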

Changes:

  • Added PUWeightROOT function to read pileup weights from ROOT files containing data and MC pileup distributions
  • Added function declaration in the header file
  • Updated jsonpog-integration submodule reference

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 4 comments.

Files reviewed:

  • src/reweighting.cxx: Implements the new PUWeightROOT function with histogram-based PU weight calculation and adds a clarifying note about ZPtMass function usage
  • include/reweighting.hxx: Adds the function declaration for the new PUWeightROOT producer
  • data/jsonpog-integration: Updates the submodule commit reference

Comment on lines +76 to +79
TFile* datafile = TFile::Open(datafilename.c_str(), "READ");
TFile* mcfile = TFile::Open(mcfilename.c_str(), "READ");
TH1D* datahist = (TH1D*)datafile->Get(histname.c_str());
TH1D* mchist = (TH1D*)mcfile->Get(histname.c_str());

Copilot AI Jan 15, 2026

File and histogram retrieval lack null pointer checks. If files fail to open or histograms don't exist, subsequent operations will cause segmentation faults. Add validation after each TFile::Open and Get call.

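A minimal sketch of the suggested validation, assuming datafilename, mcfilename, and histname from the snippet are std::string; the use of std::runtime_error (requires <stdexcept>) is illustrative, the producer may prefer its own logging:

TFile* datafile = TFile::Open(datafilename.c_str(), "READ");
if (!datafile || datafile->IsZombie()) {
    throw std::runtime_error("PUWeightROOT: could not open data PU file " + datafilename);
}
TFile* mcfile = TFile::Open(mcfilename.c_str(), "READ");
if (!mcfile || mcfile->IsZombie()) {
    throw std::runtime_error("PUWeightROOT: could not open MC PU file " + mcfilename);
}
// dynamic_cast also guards against an object of the wrong type under this name
auto* datahist = dynamic_cast<TH1D*>(datafile->Get(histname.c_str()));
auto* mchist = dynamic_cast<TH1D*>(mcfile->Get(histname.c_str()));
if (!datahist || !mchist) {
    throw std::runtime_error("PUWeightROOT: histogram " + histname + " not found in PU input files");
}
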
delete mcfile;

// Define lambda for event weight
auto puweightlambda = [datahist, mchist](const float truePUMean) {

Copilot AI Jan 15, 2026

The lambda function name 'puweightlambda' is redundant since it's clearly a lambda. Consider renaming to 'calculatePUWeight' or 'getPUWeight' for clarity.

float mc = mchist->GetBinContent(mcBin);
if (mc > 0.0) {
float ratio = data / mc;
if (ratio > 5.0) return 5.0f;

Copilot AI Jan 15, 2026

The magic number 5.0 for the maximum weight ratio lacks explanation. Add a comment explaining why this cap is necessary or define it as a named constant.

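One possible shape for this, as a sketch; the constant name and the explanatory comment are suggestions, only the value 5.0 is taken from the snippet:

// Give the cap a name and a justification instead of a bare 5.0.
constexpr float maxPUWeight = 5.0f;  // protects against large data/MC ratios
                                     // from low-statistics MC bins in the tails

float cappedRatio(float data, float mc) {
    if (mc <= 0.0f) return 1.0f;  // no MC content: fall back to unity
    const float ratio = data / mc;
    return (ratio > maxPUWeight) ? maxPUWeight : ratio;
}
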
Comment on lines +80 to +87
datahist->SetDirectory(0);
mchist->SetDirectory(0);
datahist->Scale(1.0 / datahist->Integral());
mchist->Scale(1.0 / mchist->Integral());
datafile->Close();
mcfile->Close();
delete datafile;
delete mcfile;

Copilot AI Jan 15, 2026

Histograms are detached from their files with SetDirectory(0) but never deleted, causing a memory leak. The histograms captured by the lambda will persist for the lifetime of the DataFrame. Consider using std::unique_ptr or document that this is intentional for the lambda's lifetime.

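A sketch of one way to make the histogram ownership explicit, reusing datahist, mchist, and truePUMean from the snippets above; the names data_pu and mc_pu are hypothetical, the weight logic is simplified (the cap is omitted), and std::shared_ptr is chosen over std::unique_ptr because a unique_ptr capture would make the lambda move-only:

#include <memory>

// Detach the histograms from their files and hand ownership to shared_ptr,
// so they are deleted automatically when the last copy of the lambda goes away.
datahist->SetDirectory(nullptr);
mchist->SetDirectory(nullptr);
std::shared_ptr<TH1D> data_pu(datahist);
std::shared_ptr<TH1D> mc_pu(mchist);

auto puweightlambda = [data_pu, mc_pu](const float truePUMean) {
    const float data = data_pu->GetBinContent(data_pu->FindBin(truePUMean));
    const float mc = mc_pu->GetBinContent(mc_pu->FindBin(truePUMean));
    return (mc > 0.0f) ? data / mc : 1.0f;
};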